
# Deepseek-v3 distillation

**Virtuoso Medium V2**
Developer: arcee-ai · License: Apache-2.0
A 32-billion-parameter language model built on the Qwen-2.5-32B architecture and trained via DeepSeek-V3 distillation, achieving strong results across multiple benchmarks.
Tags: Large Language Model, Transformers
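The card above notes that the model was trained through distillation from DeepSeek-V3. As general background (this is not a description of Arcee's actual training pipeline), logit-level knowledge distillation typically minimizes the KL divergence between the teacher's and student's temperature-softened output distributions. A minimal NumPy sketch, with all function names hypothetical:

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax over the last axis, numerically stabilized.
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # KL(teacher || student) on softened distributions, scaled by T^2
    # (the classic Hinton-style soft-target loss; illustrative only).
    p = softmax(teacher_logits, temperature)  # soft teacher targets
    q = softmax(student_logits, temperature)  # student predictions
    kl = np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1)
    return (temperature ** 2) * kl.mean()

# Matching logits give (near-)zero loss; diverging logits increase it.
teacher = np.array([[2.0, 1.0, 0.1]])
student = np.array([[0.1, 1.0, 2.0]])
print(distillation_loss(teacher, teacher))  # ~0.0
print(distillation_loss(student, teacher))  # > 0
```

In practice this soft-target term is usually combined with the ordinary cross-entropy loss on ground-truth labels; the temperature controls how much of the teacher's ranking over non-top tokens is exposed to the student.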